"

Set 56 Problem number 19


Problem

A sound source is moving toward a microphone at 27 m/s, emitting sound at a constant frequency of 400 Hz (i.e., 400 pulses per second).  Sound travels at about 340 m/s.

If the source starts out 35 meters from the microphone, then how long does it take the first sound pulse to reach the microphone, and how long does it take the source to reach the microphone?

Solution

The source requires 35 m / (27 m/s) = 1.296 seconds to reach the detector; sound requires 35 m / (340 m/s) = .1029 seconds to reach the detector.
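These two times can be checked with a short calculation, sketched below in Python (the variable names are my own, not part of the original problem):

    distance = 35.0    # initial separation in meters
    v_source = 27.0    # speed of the source in m/s
    v_sound = 340.0    # speed of sound in m/s

    t_source = distance / v_source      # time for the source to reach the detector
    t_first_pulse = distance / v_sound  # time for the first pulse to reach the detector

    print(f"source arrives after {t_source:.4f} s")            # ~1.2963 s
    print(f"first pulse arrives after {t_first_pulse:.4f} s")  # ~0.1029 s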

The first pulse, emitted at the 35 meter distance, is not detected until .1029 seconds after its emission, while the last pulse is emitted at the detector and is detected immediately.  Thus all the pulses are detected in .1029 seconds less than the time required to emit them.  Since the pulses are emitted during the 1.296 seconds required for the source to reach the detector, the time required for detection is

1.296 sec - .1029 sec = 1.193 sec.

During the 1.296 seconds the number of pulses emitted is 1.296 sec * 400 pulses/sec = 518.4 pulses.  Thus 518.4 pulses are received in 1.193 seconds, and the observed frequency is

518.4 pulses / 1.193 sec = 434.5 pulses/sec = 434.5 Hz, approximately.
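The rest of the numerical solution follows the same pattern; here is a minimal Python sketch of it (again, the names are illustrative only):

    f_emitted = 400.0             # pulses per second
    t_source = 35.0 / 27.0        # ~1.2963 s, interval over which pulses are emitted
    t_first_pulse = 35.0 / 340.0  # ~0.1029 s, head start needed by the first pulse

    pulses = f_emitted * t_source          # ~518.5 pulses emitted
    t_receive = t_source - t_first_pulse   # ~1.1934 s, interval over which pulses arrive
    f_observed = pulses / t_receive        # ~434.5 Hz

    print(f"{pulses:.1f} pulses in {t_receive:.4f} s -> {f_observed:.1f} Hz")

Carrying the unrounded times gives 518.5 pulses and 434.5 Hz; the 518.4 above comes from rounding the emission time to 1.296 sec before multiplying.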

Generalized Solution

If a source moving toward an observer at velocity vSource emits sound at frequency f, and if sound travels at velocity vSound, then in any time interval `dt the source will move distance

vSource * `dt

and will emit f * `dt pulses.

The first pulse will reach the position at which the last pulse is emitted in the time required for sound to travel the distance vSource * `dt moved by the source.  The interval over which the pulses are received is therefore shorter than the interval over which they are emitted by

`dtPulse = vSource * `dt / vSound.

Thus the f * `dt pulses will be received in the time interval `dt - `dtPulse, and the observed frequency must be

fObserved = f * `dt / (`dt - `dtPulse) = f * `dt / (`dt - vSource * `dt / vSound) = f / (1 - vSource / vSound).
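As a final check, the general formula reproduces the numerical answer above.  A minimal sketch, where doppler_approaching is a hypothetical helper name of my own:

    def doppler_approaching(f, v_source, v_sound):
        # Observed frequency for a source moving straight toward a stationary observer.
        return f / (1.0 - v_source / v_sound)

    print(doppler_approaching(400.0, 27.0, 340.0))  # ~434.5 Hz, as in the worked example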
